# Wikipedia pretrained

## Roberta Base Indonesian 522M

- License: MIT
- Author: cahya
- Tags: Large Language Model, Other

An Indonesian pretrained language model based on the RoBERTa-base architecture, trained on Indonesian Wikipedia data. The model is uncased (case-insensitive).
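Since this is a RoBERTa-style masked language model, fill-mask is the natural way to probe it. Below is a minimal usage sketch, assuming the model is hosted on Hugging Face under the repo ID `cahya/roberta-base-indonesian-522M` (an ID inferred from the listing, not stated on this page) and that the `transformers` library is installed:

```python
from transformers import pipeline

# Hypothetical repo ID inferred from the listing; verify on huggingface.co before use.
model_id = "cahya/roberta-base-indonesian-522M"

# fill-mask is the standard probe for a masked language model.
fill_mask = pipeline("fill-mask", model=model_id)

# Read the mask token from the tokenizer instead of hard-coding "<mask>" vs "[MASK]".
mask = fill_mask.tokenizer.mask_token

# Indonesian: "Mother is cooking <mask> the kitchen" (expected completion: "di").
for pred in fill_mask(f"Ibu sedang memasak {mask} dapur."):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```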
## Distilbert Base En Fr Es De Zh Cased

- License: Apache-2.0
- Author: Geotrend
- Tags: Large Language Model, Transformers, Supports Multiple Languages

A lightweight version of distilbert-base-multilingual-cased that supports five languages (English, French, Spanish, German, and Chinese) while preserving the original model's accuracy.
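A trimmed multilingual encoder like this is typically used to produce contextual embeddings. The sketch below encodes the same sentence in two of the five supported languages and inspects the hidden-state shapes; it assumes the model is hosted on Hugging Face under the repo ID `Geotrend/distilbert-base-en-fr-es-de-zh-cased` (inferred from the listing, not stated on this page) and that `transformers` and `torch` are installed:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Hypothetical repo ID inferred from the listing; verify on huggingface.co before use.
model_id = "Geotrend/distilbert-base-en-fr-es-de-zh-cased"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# Encode the same sentence in English and French and inspect the
# hidden-state shapes produced by the distilled encoder.
sentences = ["The weather is nice today.", "Il fait beau aujourd'hui."]
with torch.no_grad():
    for text in sentences:
        inputs = tokenizer(text, return_tensors="pt")
        hidden = model(**inputs).last_hidden_state  # (batch, seq_len, hidden_dim)
        print(f"{text} -> {tuple(hidden.shape)}")
```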